Large Language Models Can Learn Temporal Reasoning
Xiong, Siheng, Payani, Ali, Kompella, Ramana, Fekri, Faramarz
Large language models (LLMs) learn temporal concepts from the co-occurrence of related tokens in a sequence. Compared with conventional text generation, temporal reasoning, which reaches a conclusion based on mathematical, logical, and commonsense knowledge, is more challenging. In this paper, we propose TempGraph-LLM, a new paradigm for text-based temporal reasoning. Specifically, we first teach LLMs to translate the context into a temporal graph. A synthetic dataset, which is fully controllable and requires minimal supervision, is constructed for pre-training on this task. Our experiments show that LLMs benefit from this pre-training when performing other tasks. On top of that, we guide LLMs to perform symbolic reasoning with the strategies of Chain-of-Thought (CoT) bootstrapping and dedicated data augmentation. We observe that CoTs with symbolic reasoning yield more consistent and reliable results than those using free text.
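The core idea of translating text into a temporal graph and then reasoning symbolically can be illustrated with a minimal sketch. This is an assumption-laden toy example, not the paper's implementation: events and `before` edges are hypothetical, and the symbolic step shown is a simple transitive closure over ordering relations.

```python
# Toy sketch of symbolic temporal reasoning over a temporal graph
# (illustrative only; not TempGraph-LLM's actual pipeline).
from itertools import product

def transitive_closure(before_edges):
    """Given (earlier, later) event pairs, derive all implied orderings."""
    closure = set(before_edges)
    changed = True
    while changed:
        changed = False
        # product() snapshots the current closure, so adding during the
        # loop is safe; we repeat until no new edge is derived.
        for (a, b), (c, d) in product(closure, repeat=2):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True
    return closure

# Hypothetical events an LLM might extract from a context passage.
edges = {("born", "graduated"), ("graduated", "hired")}
facts = transitive_closure(edges)

def is_before(e1, e2, facts):
    """Answer an ordering question from derived facts, not free text."""
    return (e1, e2) in facts

print(is_before("born", "hired", facts))  # True, by transitivity
```

Because the answer is read off a closed set of facts rather than generated token by token, the same question always gets the same answer, which is the consistency benefit the abstract points to.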
Knowledge Patterns
Clark, Peter, Thompson, John, Porter, Bruce
This chapter describes a new technique, called "knowledge patterns", for helping construct axiom-rich, formal ontologies, based on identifying and explicitly representing recurring patterns of knowledge (theory schemata) in the ontology, and then stating how those patterns map onto domain-specific concepts in the ontology. From a modeling perspective, knowledge patterns provide an important insight into the structure of a formal ontology: rather than viewing a formal ontology simply as a list of terms and axioms, the knowledge-pattern view treats it as a collection of abstract, modular theories (the "knowledge patterns") plus a collection of modeling decisions stating how different aspects of the world can be modeled using those theories. Knowledge patterns make both those abstract theories and their mappings to the domain of interest explicit, thus making modeling decisions clear and avoiding some of the ontological confusion that can otherwise arise. In addition, from a computational perspective, knowledge patterns provide a simple and computationally efficient mechanism for facilitating knowledge reuse. We describe the technique and an application built using it, and then critique its strengths and weaknesses. We conclude that this technique enables us to better explicate both the structure of, and the modeling decisions made when constructing, a formal axiom-rich ontology.
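The mechanism described above, an abstract theory plus a mapping onto domain terms, can be sketched in a few lines. This is a hypothetical illustration: the pattern name, axiom strings, and symbol mapping below are invented for the example, not taken from the chapter.

```python
# Toy sketch of a knowledge pattern: an abstract theory stated over
# generic symbols, reused by rebinding those symbols to domain concepts.
# (Illustrative only; axioms and names are hypothetical.)

# Abstract pattern: axioms written over the generic symbols
# "Conduit" and "Substance".
flow_pattern = [
    "isa(?x, Conduit) -> can_carry(?x, Substance)",
    "connected(?a, ?b) & carries(?a, Substance) -> reaches(Substance, ?b)",
]

def instantiate(pattern, mapping):
    """Rewrite a pattern's generic symbols into domain-specific terms."""
    axioms = []
    for axiom in pattern:
        for generic, specific in mapping.items():
            axiom = axiom.replace(generic, specific)
        axioms.append(axiom)
    return axioms

# Modeling decision: view electrical circuits as a flow system.
circuit_axioms = instantiate(flow_pattern, {"Conduit": "Wire",
                                            "Substance": "Current"})
print(circuit_axioms[0])  # isa(?x, Wire) -> can_carry(?x, Current)
```

The pattern is written once and each mapping is a recorded modeling decision, which is why the technique makes reuse cheap and the modeler's choices explicit.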
What's Next For You? How AI Is Transforming Talent Management
Bottom Line: Taking on the talent crisis with greater intelligence and insight, delivering a consistently excellent candidate experience, and making diversity and inclusion part of their DNA differentiate the growing businesses that are attracting and retaining employees. These findings come from the book What's Next For You? by Ashutosh Garg, CEO and Co-Founder, and Kamal Ahluwalia, President, of eightfold.ai, and are just a sample of the depth of data-driven content and the roadmap the book delivers. Co-authors Garg's and Ahluwalia's expertise in applying AI and machine learning to talent-management problems with a strong data-first mindset is evident throughout the book. What makes it noteworthy is how the authors write from the heart first, with empathy for applicants and hiring managers, supporting key points with data.